Online Optimization with Gradual Variations

Authors

  • Chao-Kai Chiang
  • Tianbao Yang
  • Chia-Jung Lee
  • Mehrdad Mahdavi
  • Chi-Jen Lu
  • Rong Jin
  • Shenghuo Zhu
Abstract

We study the online convex optimization problem, in which an online algorithm has to make repeated decisions with convex loss functions and hopes to achieve a small regret. We consider a natural restriction of this problem in which the loss functions have a small deviation, measured by the sum of the distances between every two consecutive loss functions, according to some distance metric. We show that for linear and general smooth convex loss functions, an online algorithm modified from the gradient descent algorithm can achieve a regret which scales only as the square root of the deviation. For the closely related problem of prediction with expert advice, we show that an online algorithm modified from the multiplicative update algorithm can also achieve a similar regret bound for a different measure of deviation. Finally, for loss functions which are strictly convex, we show that an online algorithm modified from the online Newton step algorithm can achieve a regret which is only logarithmic in terms of the deviation, and as an application, we obtain such a logarithmic regret for the portfolio management problem.
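The modified gradient descent method described above can be sketched as an "optimistic" update that uses the previous round's gradient as a hint for the current round, so the regret shrinks when consecutive loss functions are close. The sketch below is a minimal illustration over the Euclidean unit ball, not the paper's exact algorithm; the function names, the fixed step size `eta`, and the zero initialization are illustrative assumptions.

```python
import numpy as np

def project_ball(x, radius=1.0):
    """Project x onto the Euclidean ball of the given radius."""
    norm = np.linalg.norm(x)
    return x if norm <= radius else x * (radius / norm)

def optimistic_ogd(grad_fn, T, dim, eta=0.1):
    """Optimistic online gradient descent (illustrative sketch).

    Each round plays x_t computed from a secondary iterate z using the
    previous gradient as a hint, then updates z with the gradient that
    is actually observed.  When consecutive gradients vary little, the
    hint is accurate and the regret scales with the gradient deviation
    rather than with T.
    """
    z = np.zeros(dim)        # secondary iterate
    hint = np.zeros(dim)     # previous gradient, used as a prediction
    plays = []
    for t in range(T):
        x = project_ball(z - eta * hint)   # optimistic step: trust the hint
        g = grad_fn(t, x)                  # observe the true gradient at x
        z = project_ball(z - eta * g)      # standard descent step on z
        hint = g                           # next round predicts g repeats
        plays.append(x)
    return plays
```

For example, with linear losses whose gradients drift slowly (say `grad_fn = lambda t, x: np.array([np.cos(0.01 * t), np.sin(0.01 * t)])`), consecutive gradients are nearly identical, so the hint is almost exact and the per-round regret contribution is small.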

Similar Resources

Commentary on "Online Optimization with Gradual Variations"

This commentary is about (Chiang et al., 2012b). This paper is the result of a merge between two papers, (Yang et al., 2012) and (Chiang et al., 2012a). Both papers address the same question: is it possible to obtain regret bounds in various online learning settings that depend on some notion of variation in the costs, rather than the number of periods? Both papers give remarkably similar algor...

A New Fuzzy Stabilizer Based on Online Learning Algorithm for Damping of Low-Frequency Oscillations

A multi objective Honey Bee Mating Optimization (HBMO) designed by online learning mechanism is proposed in this paper to optimize the double Fuzzy-Lead-Lag (FLL) stabilizer parameters in order to improve low-frequency oscillations in a multi machine power system. The proposed double FLL stabilizer consists of a low pass filter and two fuzzy logic controllers whose parameters can be set by the ...

Online Distribution and Load Balancing Optimization Using the Robin Hood and Johnson Hybrid Algorithm

Proper planning of assembly lines is one of the production managers’ concerns at the tactical level so that it would be possible to use the machine capacity, reduce operating costs and deliver customer orders on time. The lack of an efficient method in balancing assembly line can create threatening problems for manufacturing organizations. The use of assembly line balancing methods cannot balan...

Optimization of Online induction Sensor for Ferrous Metals Particles Identification in Engine Oil

Engine oil is one of the most important parameters in an internal combustion engine and plays an effective role in component wear. One of the ways to optimize the performance of IC engines is online monitoring of wear particles in engine oil. There are different ways to identify these particles, most of which are offline. Nowadays, online oil monitoring sensors are quickly being developed. In this study ...

A Self-organizing Multi-agent System for Online Unsupervised Learning in Complex Dynamic Environments

The task of continuous online unsupervised learning of streaming data in complex dynamic environments under conditions of uncertainty is an NP-hard optimization problem for general metric spaces. This paper describes a computationally efficient adaptive multi-agent approach to continuous online clustering of streaming data, which is originally sensitive to environmental variations and provides ...


Publication date: 2012